mg2vec: Learning Relationship-Preserving Heterogeneous Graph Representations via Metagraph Embedding
Authors
Abstract
Similar papers
Learning Graph Representations with Embedding Propagation
Label Representations
• Let l ∈ R^d be the representation of label l, and f be a differentiable embedding function.
• For labels of label type i, we apply a learnable embedding function l = f_i(l).
• h_i(v) is the embedding of label type i for vertex v: h_i(v) = g_i({l | l ∈ labels of type i associated with vertex v}).
• h̃_i(v) is the reconstruction of the embedding of label type i for vertex v: h̃_i(v) ...
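A minimal sketch of what these label-type embeddings could look like, assuming a mean combiner for g_i and a margin-based reconstruction objective; the toy vocabulary, dimension, and the use of an average are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: labels of one type (e.g. words), each with a
# learnable vector in R^d (random here, for illustration only).
d = 8
vocab = ["graph", "node", "edge", "embed", "learn"]
label_vec = {l: rng.normal(size=d) for l in vocab}

def h(labels):
    """g_i: combine the label vectors of one type into a vertex embedding
    (a plain average is one common choice for the combiner g_i)."""
    return np.mean([label_vec[l] for l in labels], axis=0)

def h_tilde(neighbor_label_sets):
    """Reconstruction h~_i(v): aggregate the type-i embeddings of v's neighbors."""
    return np.mean([h(ls) for ls in neighbor_label_sets], axis=0)

# A vertex v with two labels, and two neighbors with their own label sets:
hv = h(["graph", "node"])
hv_rec = h_tilde([["node", "edge"], ["graph", "embed"]])

# A margin loss would then pull hv toward its reconstruction hv_rec
# relative to a negative sample (here the embedding of an unrelated label):
loss = np.maximum(0.0, 1.0 - hv @ hv_rec + hv @ h(["learn"]))
print(hv.shape, float(loss))
```

The reconstruction term is what lets embedding propagation train without explicit supervision: a vertex's own label embedding should be predictable from its neighborhood.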
Graph attribute embedding via Riemannian submersion learning
In this paper, we tackle the problem of embedding a set of relational structures into a metric space for purposes of matching and categorisation. To this end, we view the problem from a Riemannian perspective and make use of the concepts of charts on the manifold to define the embedding as a mixture of class-specific submersions. Formulated in this manner, the mixture weights are recovered usin...
Graph Embedding aided Relationship Prediction in Heterogeneous Networks
We consider the problem of predicting relationships in large-scale heterogeneous networks. For example, one can try to predict if a researcher will publish at a conference (e.g., VLDB) given her previous publications, or try to anticipate if two reputed researchers working in the same area will collaborate. The main challenge is to extract latent information from such real-world networks which are...
Embedding Heterogeneous Data by Preserving Multiple Kernels
Heterogeneous data may arise in many real-life applications under different scenarios. In this paper, we formulate a general framework to address the problem of modeling heterogeneous data. Our main contribution is a novel embedding method, called multiple kernel preserving embedding (MKPE), which projects heterogeneous data into a unified embedding space by preserving cross-domain interactions ...
Word Representations via Gaussian Embedding
Current work in lexical distributed representations maps each word to a point vector in low-dimensional space. Mapping instead to a density provides many interesting advantages, including better capturing uncertainty about a representation and its relationships, expressing asymmetries more naturally than dot product or cosine similarity, and enabling more expressive parameterization of decision...
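A small sketch of the asymmetry point above: the KL divergence between two Gaussian word densities is direction-dependent, unlike dot product or cosine similarity between point vectors. The means and variances below are invented purely for illustration:

```python
import numpy as np

def kl_diag_gauss(mu1, var1, mu2, var2):
    """KL(N1 || N2) for diagonal Gaussians -- an asymmetric similarity."""
    return 0.5 * np.sum(
        var1 / var2 + (mu2 - mu1) ** 2 / var2 - 1.0 + np.log(var2 / var1)
    )

# Hypothetical densities: a broad general term vs. a narrower specific one.
mu_broad, var_broad = np.zeros(4), np.full(4, 2.0)
mu_narrow, var_narrow = np.full(4, 0.3), np.full(4, 0.5)

kl_ab = kl_diag_gauss(mu_narrow, var_narrow, mu_broad, var_broad)
kl_ba = kl_diag_gauss(mu_broad, var_broad, mu_narrow, var_narrow)
print(kl_ab != kl_ba)  # direction matters, which can encode entailment-like relations
```

A point embedding compared with cosine similarity always scores the two directions identically; densities compared with KL do not, which is what makes asymmetric lexical relations expressible.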
Journal
Journal title: IEEE Transactions on Knowledge and Data Engineering
Year: 2020
ISSN: 1041-4347,1558-2191,2326-3865
DOI: 10.1109/tkde.2020.2992500